Perron-Frobenius operator
Metric on Nonlinear Dynamical Systems with Perron-Frobenius Operators
The development of a metric for structural data is a long-standing problem in pattern recognition and machine learning. In this paper, we develop a general metric for comparing nonlinear dynamical systems that is defined with Perron-Frobenius operators in reproducing kernel Hilbert spaces. Our metric includes, as special cases, the existing fundamental metrics for dynamical systems, which are essentially defined via principal angles between appropriately chosen subspaces. We also describe the estimation of our metric from finite data. We empirically illustrate our metric with an example of rotation dynamics in the unit disk in the complex plane, and evaluate its performance on real-world time-series data.
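The principal-angle construction that this metric generalizes can be sketched in a few lines. The following is an illustrative toy, not the paper's kernel-based metric: it compares two rotation-like signals (assumed frequencies 0.30 and 0.35, delay depth 10, and rank 2 are all choices made here for illustration) via principal angles between their delay-embedding subspaces, combined into a chordal distance.

```python
import numpy as np
from scipy.linalg import subspace_angles

def hankel(signal, depth):
    """Delay-embedding (Hankel) matrix: columns are delay vectors."""
    return np.column_stack([signal[i:i + depth]
                            for i in range(len(signal) - depth + 1)])

def signal_subspace(signal, depth, rank):
    """Orthonormal basis of the dominant subspace of the delay embedding."""
    U, _, _ = np.linalg.svd(hankel(signal, depth), full_matrices=False)
    return U[:, :rank]

t = np.arange(200)
s1 = np.cos(0.30 * t)   # rotation dynamics at frequency 0.30
s2 = np.cos(0.35 * t)   # a slightly different rotation

# Each pure rotation spans a 2-D subspace of the 10-D delay space
Q1 = signal_subspace(s1, depth=10, rank=2)
Q2 = signal_subspace(s2, depth=10, rank=2)

# Principal angles between the subspaces, combined into a chordal distance
angles = subspace_angles(Q1, Q2)
dist = np.sqrt(np.sum(np.sin(angles) ** 2))
```

The distance is zero for identical dynamics and grows with the frequency mismatch, which is the basic behavior the subspace-based metrics formalize.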
Data-driven approximation of transfer operators for mean-field stochastic differential equations
Eirini Ioannou, Stefan Klus, Gonçalo dos Reis
Mean-field stochastic differential equations, also called McKean--Vlasov equations, are the limiting equations of interacting particle systems with fully symmetric interaction potential. Such systems play an important role in a variety of fields ranging from biology and physics to sociology and economics. Global information about the behavior of complex dynamical systems can be obtained by analyzing the eigenvalues and eigenfunctions of associated transfer operators such as the Perron--Frobenius operator and the Koopman operator. In this paper, we extend transfer operator theory to McKean--Vlasov equations and show how extended dynamic mode decomposition and the Galerkin projection methodology can be used to compute finite-dimensional approximations of these operators, which allows us to compute spectral properties and thus to identify slowly evolving spatiotemporal patterns or to detect metastable sets. The results will be illustrated with the aid of several guiding examples and benchmark problems including the Cormier model, the Kuramoto model, and a three-dimensional generalization of the Kuramoto model.
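Extended dynamic mode decomposition (EDMD), which the paper extends to the McKean--Vlasov setting, reduces in its basic form to a least-squares problem. The sketch below is the plain (non-mean-field) version on a toy scalar system x_{t+1} = 0.8 x_t with a monomial dictionary, both chosen here purely for illustration; for this system the finite-dimensional Koopman matrix is exact and its eigenvalues are 1, 0.8, and 0.64.

```python
import numpy as np

rng = np.random.default_rng(1)

# Snapshot pairs (x, y) from the scalar linear system x_{t+1} = 0.8 x_t
lam = 0.8
x = rng.uniform(-1.0, 1.0, 500)
y = lam * x

# Monomial dictionary psi(x) = (1, x, x^2)
def psi(v):
    return np.column_stack([np.ones_like(v), v, v ** 2])

# EDMD: least-squares fit of the Koopman matrix K with psi(y) ≈ psi(x) @ K
K, *_ = np.linalg.lstsq(psi(x), psi(y), rcond=None)

# Eigenvalues of K approximate Koopman eigenvalues: lam^2, lam, 1
eigvals = np.sort(np.linalg.eigvals(K).real)
```

The Galerkin interpretation is the same computation: `K` is the projection of the operator onto the span of the dictionary functions.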
Learning dynamically inspired invariant subspaces for Koopman and transfer operator approximation
Transfer and Koopman operator methods offer a framework for representing complex, nonlinear dynamical systems via linear transformations, enabling a deeper understanding of the underlying dynamics. The spectra of these operators provide important insights into system predictability and emergent behaviour, although efficiently estimating them from data can be challenging. We approach this issue through the lens of general operator and representational learning, in which we approximate these linear operators using efficient finite-dimensional representations. Specifically, we machine-learn orthonormal basis functions that are dynamically tailored to the system. This learned basis provides a particularly accurate approximation of the operator's action as well as a nearly invariant finite-dimensional subspace. We illustrate our approach with examples that showcase the retrieval of spectral properties from the estimated operator, and emphasise the dynamically adaptive quality of the machine-learned basis.
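The payoff of a dynamically adapted, (nearly) invariant basis can be seen even with a linear stand-in for the machine-learned one. In this sketch — an assumption-laden toy, not the paper's learning procedure — planar dynamics with eigenvalues 0.9 and 0.5 are hidden inside a random 20-dimensional lift; a truncated SVD of the snapshots supplies an orthonormal, data-adapted basis whose span is exactly invariant, so the restricted operator recovers the spectrum.

```python
import numpy as np

rng = np.random.default_rng(2)

# Low-dimensional linear dynamics observed through a random 20-D lift
A = np.diag([0.9, 0.5])
lift = rng.standard_normal((20, 2))
z = rng.standard_normal((2, 300))
X, Y = lift @ z, lift @ (A @ z)          # snapshot pairs in R^20

# Data-adapted orthonormal basis: dominant left singular vectors of X
U, _, _ = np.linalg.svd(X, full_matrices=False)
Q = U[:, :2]                             # spans an invariant 2-D subspace

# Operator restricted to the learned subspace; its spectrum matches A's
K = (Q.T @ Y) @ np.linalg.pinv(Q.T @ X)
eigvals = np.sort(np.linalg.eigvals(K).real)
```

For genuinely nonlinear systems the invariant subspace must be found among nonlinear observables, which is where the machine-learned basis functions come in.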
Learning graphons from data: Random walks, transfer operators, and spectral clustering
Stefan Klus, Jason J. Bramburger
Many signals in the real world that evolve in time can be modeled as a stochastic process with the signal randomly jumping from one state to another as time proceeds. When the signal can only exhibit a finite number of possible states, one can interpret the evolution of the signal as a random walk on a graph with vertices representing the states of the signal and edge weights encoding the transition probabilities from one state to another. In particular, one arrives at a Markov chain representation of the signal that can be estimated using only the signal data. However, many realistic signals can take on a continuum of values, and so the goal of this work is to present a framework for modeling continuous-space stochastic signals and to identify metastable and coherent sets via clustering techniques. We present a data-driven method to learn the discrete-time transition probabilities of stochastic signals evolving in continuous space, which can be regarded as a generalization of the discrete-space case considered in [25, 22]. The underlying theory is developed by invoking the concept of a graphon, which can be defined as the limit of sequences of dense networks that grow without bound [35, 34, 21, 18]. As recently shown in [43], graphons provide a well-developed framework for extending the concepts of random walks on finite graphs to stochastic processes evolving in continuous space. For example, random walks on graphs can be used to measure the centrality of vertices, and these concepts can also be extended to graphons [4]. Our goal is to identify transition probabilities, clusters, and the graphon itself from random walk data.
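The discrete-space starting point — estimate a Markov transition matrix from signal data, then read metastable sets off its spectrum — can be sketched directly. This is a minimal finite-state illustration, not the paper's graphon construction; the double-well signal, bin count, and noise level are all assumptions made here. The leading eigenvalue of the estimated matrix is 1, and a second eigenvalue close to 1 flags the slow hopping between the two metastable wells.

```python
import numpy as np

rng = np.random.default_rng(3)

# A double-well diffusion: two metastable regions near x = ±1
x = np.empty(20001)
x[0] = 1.0
for i in range(20000):
    x[i + 1] = x[i] + 0.1 * (x[i] - x[i] ** 3) + 0.15 * rng.standard_normal()

# Discretize the continuous state space into bins and count transitions
edges = np.linspace(-2.0, 2.0, 21)
s = np.clip(np.digitize(x, edges) - 1, 0, 19)
counts = np.zeros((20, 20))
for a, b in zip(s[:-1], s[1:]):
    counts[a, b] += 1.0

# Row-normalize into a Markov transition matrix (uniform rows if unvisited)
rows = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, rows, out=np.full_like(counts, 0.05), where=rows > 0)

# Sorted spectrum: eigenvalue 1, then a near-1 eigenvalue whose
# eigenvector sign structure separates the two metastable sets
w = np.sort(np.linalg.eigvals(P).real)[::-1]
```

The spectral gap after the second eigenvalue is the standard clustering signal: the corresponding eigenvector changes sign between the two wells.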
Convergent Methods for Koopman Operators on Reproducing Kernel Hilbert Spaces
Nicolas Boullé, Matthew J. Colbrook, Gustav Conradie
Data-driven spectral analysis of Koopman operators is a powerful tool for understanding numerous real-world dynamical systems, from neuronal activity to variations in sea surface temperature. The Koopman operator acts on a function space and is most commonly studied on the space of square-integrable functions. However, defining it on a suitable reproducing kernel Hilbert space (RKHS) offers numerous practical advantages, including pointwise predictions with error bounds, improved spectral properties that facilitate computations, and more efficient algorithms, particularly in high dimensions. We introduce the first general, provably convergent, data-driven algorithms for computing spectral properties of Koopman and Perron--Frobenius operators on RKHSs. These methods efficiently compute spectra and pseudospectra with error control and spectral measures while exploiting the RKHS structure to avoid the large-data limits required in the $L^2$ settings. The function space is determined by a user-specified kernel, eliminating the need for quadrature-based sampling as in $L^2$ and enabling greater flexibility with finite, externally provided datasets. Using the Solvability Complexity Index hierarchy, we construct adversarial dynamical systems for these problems to show that no algorithm can succeed in fewer limits, thereby proving the optimality of our algorithms. Notably, this impossibility extends to randomized algorithms and datasets. We demonstrate the effectiveness of our algorithms on challenging, high-dimensional datasets arising from real-world measurements and high-fidelity numerical simulations, including turbulent channel flow, molecular dynamics of a binding protein, Antarctic sea ice concentration, and Northern Hemisphere sea surface height. The algorithms are publicly available in the software package $\texttt{SpecRKHS}$.
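The kernel trick underlying RKHS-based Koopman computations can be illustrated without the paper's convergence machinery (this sketch is not the SpecRKHS algorithm). With a polynomial kernel whose RKHS is the span of {1, x, x^2} — a deliberately simple, assumption-level choice — the nonzero eigenvalues of the pseudoinverted Gram matrix times the cross-Gram matrix recover the Koopman eigenvalues of x_{t+1} = 0.5 x_t exactly: 1, 0.5, 0.25.

```python
import numpy as np

rng = np.random.default_rng(4)

# Snapshot pairs from the scalar system x_{t+1} = 0.5 x_t
x = rng.uniform(-1.0, 1.0, 100)
y = 0.5 * x

# Polynomial kernel k(a, b) = (1 + ab)^2; its RKHS is span{1, x, x^2}
def gram(a, b):
    return (1.0 + np.outer(a, b)) ** 2

G = gram(x, x)            # Gram matrix of the inputs
A = gram(x, y)            # cross-Gram matrix with the time-shifted data

# Kernel analogue of EDMD: the nonzero eigenvalues of G^+ A
# approximate Koopman eigenvalues (here 1, 0.5, 0.25)
M = np.linalg.pinv(G, rcond=1e-10) @ A
top = np.sort(np.abs(np.linalg.eigvals(M)))[::-1][:3]
```

Note how only kernel evaluations appear: the user-specified kernel fixes the function space, and no quadrature over the state space is needed — the practical advantage the abstract highlights.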
Deep learning with kernels through RKHM and the Perron-Frobenius operator
Reproducing kernel Hilbert C*-module (RKHM) is a generalization of reproducing kernel Hilbert space (RKHS) by means of C*-algebra, and the Perron-Frobenius operator is a linear operator related to the composition of functions. We derive a new Rademacher generalization bound in this setting and provide a theoretical interpretation of benign overfitting by means of Perron-Frobenius operators. By virtue of C*-algebra, the dependency of the bound on output dimension is milder than existing bounds. We show that C*-algebra is a suitable tool for deep learning with kernels, enabling us to take advantage of the product structure of operators and to provide a clear connection with convolutional neural networks. Our theoretical analysis provides a new lens through which one can design and analyze deep kernel methods.